Perceptrons with Polynomial Post-Processing
Authors
Abstract
We introduce tensor product neural networks, composed of a layer of univariate neurons followed by a net of polynomial post-processing. We examine the general approximation problem for these networks, observing in particular their relationship to the Stone-Weierstrass theorem for uniform function algebras. The implementation of the post-processing as a two-layer network with logarithmic and exponential neurons leads to potentially important 'generalised' product networks, which however require a complex approximation theory of Müntz-Szász-Ehrenpreis type. A back-propagation algorithm for product networks is presented and used in three computational experiments. In particular, approximation by a sigmoid product network is compared to that of a single-layer radial basis network and a multilayer sigmoid network.
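The log/exp realization of a product unit mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name and interface are invented here, and it assumes strictly positive inputs so that the logarithm is defined.

```python
import numpy as np

def product_unit(x, w):
    """Compute prod_i x_i**w_i as exp(sum_i w_i * log(x_i)).

    This is the two-layer trick from the abstract: a layer of
    logarithmic neurons feeding a single exponential neuron realizes
    a generalized (weighted) product of its inputs. Inputs must be
    strictly positive for the log to be defined.
    """
    x = np.asarray(x, dtype=float)
    w = np.asarray(w, dtype=float)
    if np.any(x <= 0):
        raise ValueError("log/exp product units require positive inputs")
    return float(np.exp(np.sum(w * np.log(x))))
```

For example, with unit weights the unit reduces to a plain product of its inputs, and fractional weights give roots, which is what makes the 'generalised' products of the abstract possible.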
Related papers
On Learning µ-Perceptron Networks with Binary Weights
Neural networks with binary weights are very important from both the theoretical and practical points of view. In this paper, we investigate the learnability of single binary perceptrons and unions of μ-binary-perceptron networks, i.e. an “OR” of binary perceptrons where each input unit is connected to one and only one perceptron. We give a polynomial time algorithm that PAC learns these networ...
Bounds on the Degree of High Order Binary Perceptrons
High order perceptrons are often used in order to reduce the size of neural networks. The complexity of the architecture of a usual multilayer network is then turned into the complexity of the functions performed by each high order unit and in particular by the degree of their polynomials. The main result of this paper provides a bound on the degree of the polynomial of a high order perceptron,...
Learning Linear Threshold Approximations Using Perceptrons
We demonstrate sufficient conditions for polynomial learnability of suboptimal linear threshold functions using perceptrons. The central result is as follows. Suppose there exists a vector w of n weights (including the threshold) with a given "accuracy", "average error", and "balancing separation", i.e., with high probability w correctly classifies an example x; over examples incorrectly clas...
A Pilot Sampling Method for Multi-layer Perceptrons
As the sample size grows, the accuracy of trained multi-layer perceptrons improves, with lower error rates. But we cannot use ever larger samples, because the computational cost of training multi-layer perceptrons becomes enormous and overfitting can occur. This paper suggests an effective approach to determining a proper sample size for multi-layer perceptr...
Journal: J. Exp. Theor. Artif. Intell.
Volume: 12, Issue: -
Pages: -
Publication year: 1996